English (en) → German (de)

Markov chain (noun)

  • (probability theory) A discrete-time stochastic process with the Markov property (formalized in the note below).
German: Markow-Kette
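
A minimal formal sketch of the Markov property (an illustration, not part of the dictionary entry): for a discrete-time process X_0, X_1, X_2, ..., the distribution of the next state depends only on the current state, not on the earlier history:

  P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \dots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)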
Source: Wiktionary